HEMP: High-order entropy minimization for neural network compression
Authors
Abstract
We formulate the entropy of a quantized artificial neural network as a differentiable function that can be plugged as a regularization term into the cost function minimized by gradient descent. Our formulation scales efficiently beyond the first order and is agnostic of the quantization scheme. The network can then be trained to minimize the entropy of the quantized parameters, so that they can be optimally compressed via entropy coding. We experiment with our entropy formulation at quantizing and compressing well-known network architectures over multiple datasets. Our approach compares favorably with similar methods, enjoying the benefits of a higher-order entropy estimate, showing flexibility towards non-uniform quantization (we use Lloyd-Max quantization), scalability to any entropy order to be minimized, and efficiency in terms of compression. We show that HEMP is able to work in synergy with other approaches aiming at pruning or quantizing the model itself, delivering significant benefits in terms of storage size and compressibility without harming the model's performance.
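The abstract describes making the entropy of the quantized parameters differentiable so it can be minimized by gradient descent alongside the task loss. The PyTorch snippet below is a minimal first-order sketch of that idea, assuming soft (softmax-based) assignments of weights to quantization centers; the names and parameters (soft_entropy, temperature, the choice of centers, the regularization weight) are illustrative assumptions, not taken from the paper, whose formulation extends to higher-order entropy and Lloyd-Max quantization.

```python
import torch
import torch.nn as nn

def soft_entropy(weights, centers, temperature=0.1, eps=1e-8):
    """First-order differentiable entropy estimate of quantized weights (sketch).

    Each weight is softly assigned to the quantization centers via a softmax over
    negative distances; bin probabilities are the mean soft assignments, and the
    entropy of that distribution is returned (in bits per symbol).
    """
    w = weights.reshape(-1, 1)                       # (N, 1) flattened parameters
    d = (w - centers.reshape(1, -1)).abs()           # (N, K) distances to centers
    assign = torch.softmax(-d / temperature, dim=1)  # (N, K) soft one-hot assignments
    probs = assign.mean(dim=0)                       # (K,) estimated bin probabilities
    return -(probs * (probs + eps).log2()).sum()     # differentiable entropy estimate

# Usage sketch: add the entropy of all parameters as a regularizer to the task loss.
model = nn.Linear(16, 4)
centers = torch.linspace(-1.0, 1.0, steps=8)         # e.g. 8 quantization levels
x, y = torch.randn(32, 16), torch.randn(32, 4)

task_loss = nn.functional.mse_loss(model(x), y)
reg = sum(soft_entropy(p, centers) for p in model.parameters())
loss = task_loss + 1e-2 * reg                        # trade-off weight is illustrative
loss.backward()                                      # entropy term is differentiable
```

Low entropy of the quantized parameters translates directly into shorter codes when the parameters are later compressed with an entropy coder, which is the rationale for using it as a regularizer.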
Similar references
Neural Network Classification Using Error Entropy Minimization
One way of using the entropy criteria in learning systems is to minimize the entropy of the error between two variables: typically, one is the output of the learning system and the other is the target. This framework has been used for regression. In this paper we show how to use the minimization of the entropy of the error for classification. The minimization of the entropy of the error implies...
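The snippet above describes minimizing the entropy of the error between the network output and the target. The code below is a rough sketch of one common way to do this, assuming Renyi's quadratic entropy estimated with a Gaussian Parzen window over the error samples; the names (renyi_quadratic_entropy, sigma) are illustrative and not taken from the cited paper.

```python
import torch

def renyi_quadratic_entropy(errors, sigma=1.0):
    """Parzen-window estimate of Renyi's quadratic entropy of the error signal.

    H2(e) = -log V(e), where the information potential V is the average of a
    Gaussian kernel over all pairwise differences between error samples.
    """
    e = errors.reshape(-1, 1)
    diff = e - e.T                                      # (N, N) pairwise error differences
    kernel = torch.exp(-diff.pow(2) / (4 * sigma ** 2)) # Gaussian kernel on differences
    v = kernel.mean()                                   # information potential
    return -torch.log(v)

# Training with this criterion replaces e.g. MSE by the entropy of the error:
# errors = network_output - target; loss = renyi_quadratic_entropy(errors)
# Minimizing the entropy concentrates the error distribution around a single value.
```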
The Error Entropy Minimization Algorithm for Neural Network Classification
One way of using the entropy criteria in learning systems is to minimize the entropy of the error between two variables: typically, one is the output of the learning system and the other is the target. This framework has been used for regression. In this paper we show how to use the minimization of the entropy of the error for classification. The minimization of the entropy of the error implies...
Adaptive Neural Network Method for Consensus Tracking of High-Order MIMO Nonlinear Multi-Agent Systems
This paper is concerned with the consensus tracking problem of high order MIMO nonlinear multi-agent systems. The agents must follow a leader node in presence of unknown dynamics and uncertain external disturbances. The communication network topology of agents is assumed to be a fixed undirected graph. A distributed adaptive control method is proposed to solve the consensus problem utilizing re...
High-order MS CMAC neural network
A macro structure cerebellar model articulation controller (CMAC) or MS_CMAC was developed by connecting several one-dimensional (1-D) CMACs as a tree structure, which decomposes a multidimensional problem into a set of 1-D subproblems, to reduce the computational complexity in multidimensional CMAC. Additionally, a trapezium scheme is proposed to assist MS_CMAC to model nonlinear systems. Howe...
An High-Order Graph Generating Neural Network
A large class of neural network models have their units organized in a lattice with fixed topology or generate their topology during the learning process (usually unsupervised). These network models can be used as neighborhood preserving map and some of them generate a perfect topology preserving map of the input manifold using competitive Hebbian rule. But such a structure is difficult to mana...
Journal
Journal title: Neurocomputing
Year: 2021
ISSN: 0925-2312, 1872-8286
DOI: https://doi.org/10.1016/j.neucom.2021.07.022